Function Calling & Tool Systems — Session 3
2026-02-16
The Problem: Vendor Lock-In
Every framework invented its own tool integration: like 🔌 proprietary cables, a tool written once must be rewritten for every framework. No portability, no ecosystem.
MCP = one standard protocol; any client can connect to any server.
| Component | Role |
|---|---|
| Host | The app where the AI lives (Claude, Cursor, your agent) |
| Client | The connector inside the host that speaks MCP |
| Server | Your code — exposes tools, resources, and prompts |
Building with FastMCP
FastMCP makes defining MCP tools as easy as FastAPI makes defining endpoints:
```python
from mcp.server.fastmcp import FastMCP

from registry import registry  # the shared ToolRegistry from Session 2

mcp = FastMCP("Research Assistant Tools")

@mcp.tool()
def calculate(operation: str, operand_a: float, operand_b: float) -> dict:
    """Executes a calculation using the internal registry."""
    return registry.execute("execute_calculation", {
        "operation": operation,
        "operand_a": operand_a,
        "operand_b": operand_b,
    })

if __name__ == "__main__":
    mcp.run()  # Runs over stdio by default
```
`@mcp.tool()` registers a function as an MCP tool. The key insight: your ToolRegistry is the shared kernel, and MCP is just one interface into it. MCP tools delegate to `registry.execute()`, so all rate limiting, permissions, and error handling still apply.
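For reference, a minimal sketch of what that shared registry might look like. The names (`ToolRegistry`, `register`, `execute`) and the add/multiply handler are illustrative stand-ins for the Session 2 code, not the actual course implementation:

```python
# Minimal sketch of a ToolRegistry (illustrative, not the Session 2 code):
# a dict of named handlers behind a single execute() chokepoint, which is
# where rate limiting, permissions, and error handling would live.
class ToolRegistry:
    def __init__(self):
        self._tools = {}

    def register(self, name, handler):
        self._tools[name] = handler

    def execute(self, name, args: dict) -> dict:
        # Central chokepoint: every interface (MCP, REST, CLI) passes through here.
        if name not in self._tools:
            return {"error": f"unknown tool: {name}"}
        try:
            return {"result": self._tools[name](**args)}
        except Exception as exc:
            return {"error": str(exc)}


registry = ToolRegistry()
registry.register(
    "execute_calculation",
    lambda operation, operand_a, operand_b:
        operand_a + operand_b if operation == "add" else operand_a * operand_b,
)

print(registry.execute("execute_calculation",
                       {"operation": "add", "operand_a": 10, "operand_b": 5}))
# → {'result': 15}
```

Because every caller goes through `execute()`, cross-cutting concerns are enforced once, regardless of which interface invoked the tool.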
```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def run_agent():
    server_params = StdioServerParameters(
        command="python", args=["server.py"]
    )
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover tools
            tools = await session.list_tools()
            # Call a tool
            result = await session.call_tool(
                "calculate",
                {"operation": "add", "operand_a": 10, "operand_b": 5}
            )
            print(result.content[0].text)

asyncio.run(run_agent())
```

```mermaid
sequenceDiagram
    participant C as Client (simple_agent.py)
    participant S as Server (server.py)
    C->>S: spawn process (python server.py)
    C->>S: initialize request (via stdin)
    S-->>C: initialize response (via stdout)
    C->>S: list_tools request
    S-->>C: tools list response
    C->>S: call_tool("calculate", args)
    S-->>C: tool result
```
Stdio transport: client spawns server as a subprocess. They communicate via stdin/stdout pipes. No network, no ports — simple and secure for local use.
Active agency vs passive context
| | Tools | Resources |
|---|---|---|
| Nature | Active — do things | Passive — read things |
| Examples | Calculate, send email, query DB | Logs, config files, data feeds |
| LLM interaction | Model requests execution | Model reads for context |
| Token cost | Full tool-call cycle | Direct context injection |
| Decorator | `@mcp.tool()` | `@mcp.resource()` |
The client can read `system://logs/recent` at any time — no tool-call overhead.
Prompts bake best-practice prompt engineering into the server — every user gets consistent, high-quality interactions.
| Feature | Custom Registry (Session 2) | MCP Server (Session 3) |
|---|---|---|
| Control | Total — you own the loop | Limited by the host/client |
| Interoperability | Low — works only with your code | High — any MCP client |
| Complexity | Higher — build the chat loop | Lower — just define functions |
| Best for | Standalone products/apps | Plugins, extensions, integrations |
Build your core logic in the Tool Registry (domain layer). Then expose it via multiple interfaces:
```
                  ┌─── MCP Server ──── Claude, Cursor, IDEs
                  │
ToolRegistry ─────┼─── REST API ────── Web Frontend
(shared kernel)   │
                  └─── CLI ────────── Admin Scripts
```
Your registry.py is the shared kernel. MCP is just one view into it.
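To make the "multiple views" idea concrete, here is a sketch of the CLI branch of that diagram. Everything here is illustrative (the dict-based registry stand-in, the tool name, the argument shape); the point is that the CLI is just another thin adapter over the same `execute()` chokepoint:

```python
import argparse
import json

# Illustrative stand-in for the shared registry's execute() chokepoint.
TOOLS = {
    "execute_calculation":
        lambda operation, operand_a, operand_b:
            operand_a + operand_b if operation == "add"
            else operand_a * operand_b,
}

def registry_execute(name: str, args: dict) -> dict:
    return {"result": TOOLS[name](**args)}

def main(argv=None) -> dict:
    # CLI view: same kernel as the MCP server, different transport.
    parser = argparse.ArgumentParser(description="Admin CLI over the shared registry")
    parser.add_argument("tool", help="registered tool name")
    parser.add_argument("args_json", help="tool arguments as a JSON object")
    ns = parser.parse_args(argv)
    result = registry_execute(ns.tool, json.loads(ns.args_json))
    print(json.dumps(result))
    return result

if __name__ == "__main__":
    main(["execute_calculation",
          '{"operation": "add", "operand_a": 10, "operand_b": 5}'])
```

The MCP server, a REST handler, and this CLI would each be a few lines of adapter code; the domain logic lives in the registry once.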
Up Next
Lab 4 — MCP Server: Wrap your registry in a FastMCP server, add a resource, and build a client that connects over stdio.